Loss Functions

Perceptron

$$ \ell(y, p) = -p \cdot y, \qquad p = h(\underline{w} \cdot \underline{x}) $$
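
A minimal NumPy sketch of this loss as written, assuming labels $y \in \{-1, +1\}$ and taking $h$ to be the identity by default (the section leaves $h$ unspecified); `perceptron_loss` is an illustrative name, not from any library.

```python
import numpy as np

def perceptron_loss(y, w, x, h=lambda z: z):
    """Perceptron loss l(y, p) = -p * y, with p = h(w . x).

    Assumes y in {-1, +1}; h defaults to the identity
    (an assumption, since the formula above does not pin h down).
    """
    p = h(np.dot(w, x))
    return -p * y

w = np.array([0.5, -1.0])
x = np.array([3.0, 1.0])          # w . x = 0.5
print(perceptron_loss(+1, w, x))  # -0.5: correctly classified point, negative value
print(perceptron_loss(-1, w, x))  # +0.5: misclassified point, positive value
```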

Cross-Entropy

$$ \begin{aligned} \ell(\underline{y}, \underline{p}) &= \sum_{c=1}^{C} 1_{[y_c = 1]} \cdot \log \frac{1}{p_c} \\ &= \underline{y} \cdot \log \frac{1}{\underline{p}} \end{aligned} $$
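
A small NumPy sketch of the sum above, assuming $\underline{y}$ is a one-hot label vector and $\underline{p}$ a probability vector over $C$ classes; the `eps` guard against $\log(0)$ is an implementation choice, not part of the formula.

```python
import numpy as np

def cross_entropy(y, p, eps=1e-12):
    """l(y, p) = sum_c y_c * log(1 / p_c) for one-hot y and probabilities p."""
    return np.sum(y * np.log(1.0 / (p + eps)))

y = np.array([0.0, 0.0, 1.0])    # one-hot: true class is c = 3
p = np.array([0.2, 0.1, 0.7])    # predicted class probabilities
print(cross_entropy(y, p))       # log(1 / 0.7) ~= 0.357
```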

Expanded for the two common special cases, binary classification with a sigmoid output and multi-class classification with a softmax output:

$$ \begin{aligned} \ell(y, \sigma(z)) &= y \cdot \log \frac{1}{\sigma(z)} + (1 - y) \cdot \log \frac{1}{1 - \sigma(z)} \end{aligned} $$

$$ \begin{aligned} \ell(\underline{y}, \underline{\gamma}) &= \sum_{c=1}^{C} y_c \cdot \log \frac{1}{\text{softmax}(\underline{\gamma})_{c}} \end{aligned} $$
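
Both special cases, sketched in NumPy from raw inputs: a scalar logit $z$ passed through $\sigma$ for the binary case, and a logit vector $\underline{\gamma}$ passed through softmax for the multi-class case. Function names are illustrative, and only the softmax uses the usual max-shift for stability; a sketch under those assumptions, not a reference implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def binary_cross_entropy(y, z):
    """l(y, sigma(z)) with y in {0, 1} and a scalar logit z."""
    s = sigmoid(z)
    return y * np.log(1.0 / s) + (1.0 - y) * np.log(1.0 / (1.0 - s))

def softmax(gamma):
    e = np.exp(gamma - np.max(gamma))   # max-shift keeps exp from overflowing
    return e / e.sum()

def softmax_cross_entropy(y, gamma):
    """l(y, gamma) = sum_c y_c * log(1 / softmax(gamma)_c) for one-hot y."""
    return np.sum(y * np.log(1.0 / softmax(gamma)))

print(binary_cross_entropy(1, 2.0))                     # ~= 0.127
print(softmax_cross_entropy(np.array([0.0, 1.0, 0.0]),
                            np.array([1.0, 3.0, 0.5]))) # ~= 0.197
```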

by Jon